Search for: All records

Creators/Authors contains: "Serrano, Guillermo"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. The increasing number of photovoltaic (PV) systems connected to the power grid makes the grid vulnerable to the shadows projected by moving clouds. Global Solar Irradiance (GSI) forecasting allows smart grids to optimize energy dispatch and prevent energy shortages caused by occlusion of the Sun. This investigation compares the performance of machine learning algorithms that do not require labelled images for training when applied to real-time segmentation of clouds in images acquired with a ground-based infrared sky imager. Real-time segmentation is used to extract cloud features from only the pixels in which clouds are detected (an illustrative segmentation sketch appears after this list).
  2. Moving clouds affect the Global Solar Irradiance (GSI) that reaches the surface of the Earth. As a consequence, the resources available to meet the energy demand of a smart grid powered by photovoltaic (PV) systems depend on the shadows projected by passing clouds. This research introduces a cloud-tracking algorithm to predict occlusion of the Sun. Using thermal images of clouds, the algorithm estimates multiple wind velocity fields at different altitudes, with different velocity magnitudes and directions (an illustrative motion-estimation sketch appears after this list).
  3. Mathelier, Anthony (Ed.)
    Abstract
    Motivation: An important step in the transcriptomic analysis of individual cells involves manually determining the cellular identities. To ease this labor-intensive annotation of cell types, there has been growing interest in automated cell annotation, which can be achieved by training classification algorithms on previously annotated datasets. Existing pipelines employ dataset integration methods to remove potential batch effects between source (annotated) and target (unannotated) datasets. However, the integration and classification steps are usually independent of each other and performed by different tools. We propose JIND (joint integration and discrimination for automated single-cell annotation), a neural-network-based framework for automated cell-type identification that performs integration in a space suitably chosen to facilitate cell classification. To account for batch effects, JIND performs a novel asymmetric alignment in which unseen cells are mapped onto the previously learned latent space, avoiding the need to retrain the classification model for new datasets. JIND also learns cell-type-specific confidence thresholds to identify cells that cannot be reliably classified.
    Results: We show on several batched datasets that JIND's joint approach to integration and classification outperforms existing pipelines in accuracy, and that the cell-type-specific confidence thresholds lead to a smaller fraction of cells being rejected as unlabeled. Moreover, we investigate cells misclassified by JIND and provide evidence suggesting that the errors could be due to outliers in the annotated datasets or to errors in the original approach used to annotate the target batch.
    Availability and implementation: The JIND implementation is available at https://github.com/mohit1997/JIND and the data underlying this article can be accessed at https://doi.org/10.5281/zenodo.6246322.
    Supplementary information: Supplementary data are available at Bioinformatics online.
    (An illustrative sketch of the cell-type-specific confidence-threshold rejection appears after this list.)
  4. null (Ed.)
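
Illustrative sketch for item 1. The abstract does not name the specific unsupervised algorithms that were compared, so k-means clustering on per-pixel infrared intensity is used here purely as an example; the two-cluster assumption, the cloud-is-warmer heuristic, and the extracted feature names are assumptions, not the authors' method.

```python
# Illustrative sketch only: unsupervised cloud/sky segmentation of a single
# infrared sky image, followed by feature extraction from cloud pixels only.
# k-means stands in for the unlabelled-training algorithms compared in the
# paper; all choices below are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def segment_clouds(ir_image: np.ndarray) -> np.ndarray:
    """Return a boolean mask that is True where cloud pixels are detected."""
    pixels = ir_image.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
    # In ground-based IR images clouds are usually warmer (brighter) than
    # clear sky, so keep the cluster with the higher mean intensity.
    means = [pixels[labels == k].mean() for k in (0, 1)]
    return (labels == int(np.argmax(means))).reshape(ir_image.shape)

def cloud_features(ir_image: np.ndarray, mask: np.ndarray) -> dict:
    """Compute simple statistics from the detected cloud pixels only."""
    cloud = ir_image[mask]
    return {"cloud_fraction": float(mask.mean()),
            "mean_cloud_intensity": float(cloud.mean()) if cloud.size else 0.0}
```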
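Illustrative sketch for item 2. The abstract does not describe the actual estimator, so dense Farnebäck optical flow (OpenCV) between consecutive thermal frames, followed by k-means grouping of the motion vectors into candidate wind layers, is used only as an illustration; the number of layers and the 8-bit grayscale input format are assumptions.

```python
# Illustrative sketch only: approximate cloud-motion layers from two
# consecutive thermal frames (assumed 8-bit, single-channel arrays).
# Optical flow plus k-means stands in for the paper's algorithm.
import numpy as np
import cv2
from sklearn.cluster import KMeans

def wind_layer_velocities(frame_prev: np.ndarray, frame_next: np.ndarray,
                          n_layers: int = 2) -> np.ndarray:
    """Return n_layers mean pixel displacements (dx, dy) between the frames."""
    flow = cv2.calcOpticalFlowFarneback(frame_prev, frame_next, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    vectors = flow.reshape(-1, 2)
    km = KMeans(n_clusters=n_layers, n_init=10, random_state=0).fit(vectors)
    # Each cluster center approximates the displacement of one cloud layer;
    # dividing by the frame interval would convert it to a velocity.
    return km.cluster_centers_
```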
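Illustrative sketch for item 3. The rejection step described for JIND (cell-type-specific confidence thresholds) is shown below on made-up numbers. This is not the JIND implementation, which is available at the linked GitHub repository; the classifier probabilities, threshold values, and class names are invented examples.

```python
# Illustrative sketch only: per-cell-type confidence thresholding for
# rejecting unreliable predictions, as described for JIND. Not taken from
# the JIND codebase; all numbers and class names are invented examples.
import numpy as np

def predict_with_rejection(probs, class_names, thresholds):
    """Assign each cell its top class, or 'Unassigned' when the top-class
    probability falls below that class's threshold."""
    labels = []
    for p in probs:
        best = int(np.argmax(p))
        name = class_names[best]
        labels.append(name if p[best] >= thresholds[name] else "Unassigned")
    return labels

# Example: two cell types with different learned thresholds.
probs = np.array([[0.95, 0.05],
                  [0.60, 0.40]])
print(predict_with_rejection(probs, ["T cell", "B cell"],
                             {"T cell": 0.90, "B cell": 0.70}))
# -> ['T cell', 'Unassigned']
```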